Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood
Authors
Abstract
Factorized information criterion (FIC) is a recently developed approximation technique for the marginal log-likelihood, which provides an automatic model selection framework for a few latent variable models (LVMs) with tractable inference algorithms. This paper reconsiders FIC and fills theoretical gaps left by previous FIC studies. First, we reveal the core idea of FIC that allows its generalization to a broader class of LVMs, including continuous LVMs, in contrast to previous FICs, which are applicable only to binary LVMs. Second, we investigate the model selection mechanism of the generalized FIC. Our analysis provides a formal justification of FIC as a model selection criterion for LVMs, as well as a systematic procedure for pruning redundant latent variables that had been removed heuristically in previous studies. Third, we provide an interpretation of FIC as a variational free energy and uncover a few previously unknown relationships between the two. A demonstrative study on Bayesian principal component analysis is provided, and numerical experiments support our theoretical results.
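The sense in which a Laplace-style criterion is "asymptotically accurate" for the marginal log-likelihood can be illustrated on a much simpler conjugate model where the exact answer is available in closed form. The sketch below is an illustration only, not the paper's LVM setting; the model choice (Bernoulli with a uniform prior) and function names are ours. It compares the exact log marginal likelihood with a BIC-style Laplace approximation, whose per-datum error vanishes as n grows:

```python
import math

def exact_log_marginal(k, n):
    # Exact log marginal likelihood of k successes in n Bernoulli trials
    # under a uniform Beta(1, 1) prior on the success probability:
    # p(X) = k! (n - k)! / (n + 1)!
    return math.lgamma(k + 1) + math.lgamma(n - k + 1) - math.lgamma(n + 2)

def bic_log_marginal(k, n):
    # BIC-style Laplace approximation: maximized log-likelihood
    # minus (d/2) log n, with d = 1 free parameter.
    p_hat = k / n
    loglik = k * math.log(p_hat) + (n - k) * math.log(1 - p_hat)
    return loglik - 0.5 * math.log(n)

for n in (10, 100, 10000):
    k = int(0.3 * n)
    per_datum_gap = abs(exact_log_marginal(k, n) - bic_log_marginal(k, n)) / n
    print(f"n={n:6d}  per-datum gap={per_datum_gap:.6f}")
```

Note that the absolute gap tends to a constant (the O(1) terms the approximation drops), so the accuracy here is in the per-datum sense; FIC-type criteria refine this idea for models whose Hessian degenerates.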
Similar resources
Factorized Asymptotic Bayesian Inference for Mixture Modeling
This paper proposes a novel approximate Bayesian inference method for mixture modeling. Our key idea is to factorize the marginal log-likelihood using a variational distribution over the latent variables. An asymptotic approximation, the factorized information criterion (FIC), is obtained by applying the Laplace method to each of the factorized components. In order to evaluate FIC, we propose factorize...
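As a hedged sketch (reconstructed, not quoted from the abstract above), the FIC objective for a K-component mixture takes roughly the following shape, where z_{nk} are the latent assignments, D_k is the parameter dimension of component k, D_α the dimension of the mixing weights, and H(q) the entropy of the variational distribution q:

```latex
% Sketch of the FIC objective for a K-component mixture model;
% notation is assumed, not taken from the abstract.
\begin{equation*}
\mathrm{FIC}(M) = \max_{q}\;
  \mathbb{E}_{q}\!\Big[\log p(X, Z \mid \hat{\theta}, M)
  - \sum_{k=1}^{K} \frac{D_k}{2}\,
    \log\Big(\sum_{n=1}^{N} z_{nk}\Big)\Big]
  - \frac{D_\alpha}{2}\log N + H(q)
\end{equation*}
```

The component-wise penalty log(Σ_n z_{nk}) replaces BIC's uniform (D/2) log N: a component that explains few data points pays a relatively heavier penalty, which is what shrinks redundant components away.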
Factorized Normalized Maximum Likelihood Criterion for Learning Bayesian Network Structures
This paper introduces a new scoring criterion, factorized normalized maximum likelihood, for learning Bayesian network structures. The proposed scoring criterion requires no parameter tuning, and it is decomposable and asymptotically consistent. We compare the new scoring criterion to other scoring criteria and describe its practical implementation. Empirical tests confirm its good performance.
Generalized Marginal Likelihood for Gaussian mixtures
The dominant approach in Bernoulli-Gaussian myopic deconvolution consists in the joint maximization of a single Generalized Likelihood with respect to the input signal and the hyperparameters. The aim of this correspondence is to assess the theoretical properties of a related Generalized Marginal Likelihood criterion in a simplified framework where the filter is reduced to identity. Then the outpu...
Factorized Asymptotic Bayesian Hidden Markov Models
This paper addresses the issue of model selection for hidden Markov models (HMMs). We generalize factorized asymptotic Bayesian inference (FAB), which has been recently developed for model selection on independent hidden variables (i.e., mixture models), for time-dependent hidden variables. As with FAB in mixture models, FAB for HMMs is derived as an iterative lower bound maximization algorithm...
Discriminative Learning of Bayesian Networks via Factorized Conditional Log-Likelihood
We propose an efficient and parameter-free scoring criterion, the factorized conditional log-likelihood (f̂CLL), for learning Bayesian network classifiers. The proposed score is an approximation of the conditional log-likelihood criterion. The approximation is devised in order to guarantee decomposability over the network structure, as well as efficient estimation of the optimal parameters, achi...